Web Survey Bibliography
Survey research has historically relied on a probabilistic model to underlie its sampling frame. With rare exception, online research is non‐probabilistic. Research without the safety net of a probabilistic frame raises all kinds of alarms, and challenges to the reliability of online research have grown to a crescendo as its non‐probabilistic nature has become evident. However, not all sampling frames must be probabilistic. Unfortunately, no standard metrics exist to track reliability in online sampling. In fact, whether they are access panels or social networks, there are no standardized means of balancing panels or even comparing them. To confound the situation, the commercially used convenience panels are vastly different from each other (Gittelman and Trimarchi, CASRO Panel Conference, February 2009, paper available). These differences are so far‐reaching that those who elect to use these sample sources are not only without a safety net; they are at considerable professional risk. We have completed an analysis of eighteen American panels and have found that respondent aging, the frequency of professional responders, and other satisficing behaviors, as well as dramatic differences between sociological, psychographic and buying‐behavior segmentations, make for a cacophony of differences seemingly impossible to correct.
In this study we will present the results of an extensive global study covering forty countries. Within each country, panels will be compared using a 17‐minute questionnaire, with 400 completes per panel. We hope to present five or more providers per market. No such extensive comparison has been done on a global basis; in fact, inter‐panel comparisons themselves are rare, with data from very few having been presented on any scale.
Preliminary data (24,000 interviews) show evolutionary trends in convenience panel development. Between‐panel differences appear more extreme in the United States than in other markets.
We are proposing two sets of practices: (1) using panel performance metrics [professionals, speeders, straight‐liners, invalids, inconsistencies, etc.] for which we have developed standard quantitative measures, and (2) a mechanism for developing a family of sampling standards based upon segmentation by key variables such as, but not limited to, media, purchasing and psychographics. It is the new availability of global data that allows us to present universal standards that meet the necessary requirements that are our focus in this conference. In addition to measuring performance, we believe there are three key requirements for standard panel metrics: (1) the ability to capture panel performance variations consistent with the differing needs of sample users (e.g., a broadcasting company might wish to anchor its sampling frame to media segments); (2) the ability to create a database that is retrospective, in that new sample sources can be added to the database without repeating the analysis; and (3) a focus on indices that are pragmatic in their measure (we always view buying behavior as the most pragmatic).
In this talk, we propose to use segmentation analysis as a new metric that will allow us to anchor online data in a new non‐probabilistic sampling frame. It is the existence of global data that gives us a rare opportunity to experiment with this new methodology. Our goal is to use segmentation in each country to create a fingerprint that can be consistently maintained by blending panels. By minimizing the variability of the segments through optimization and panel combination, we will establish a means of stabilizing online data irrespective of the panels and sourcing modes from which they draw their origin. We cannot stabilize online data unless we provide it with a reference point to anchor itself; the segments are that anchor. As the sourcing models continue to shift, panels will age and shift with them; we need a reliable anchor that rises above these problems. It is essential that we explore tools to measure these changes: without a means of comparison we cannot expect to measure drift, nor can we expect to have a platform for predicting the future. We do not profess to be on the road to a new probabilistic framework, but rather to a platform for comparison and continuity. We believe that there is a theoretical population online that can serve this purpose. Using the database we have gathered, which includes respondents from over 160 global panels (64,000 interviews) distributed among 40 global markets, we shall introduce new methods to build “perspective”.
Based on this, we will use our segmentation models as a means of creating a “convenience” sampling frame by averaging segments into a “Grand Mean.” Using optimization models, we will select the convenience panels that best reflect the grand mean and the proportions by which they best fit together. We shall give evidence for the efficiency of these strategies.
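The “Grand Mean” idea above can be given a minimal numerical sketch. The following is an illustrative assumption, not the authors' published optimization model: it averages segment proportions across panels and then solves, in closed form, for the two-panel blend weight that minimizes squared deviation from that grand mean. All data values and function names are hypothetical.

```python
def grand_mean(panels):
    """Average segment proportions across panels, segment by segment."""
    n = len(panels)
    return [sum(p[i] for p in panels) / n for i in range(len(panels[0]))]

def best_blend_weight(a, b, target):
    """Weight w in [0, 1] minimizing the squared deviation of the blend
    w*a + (1-w)*b from target (ordinary least squares, closed form)."""
    num = sum((t - y) * (x - y) for x, y, t in zip(a, b, target))
    den = sum((x - y) ** 2 for x, y in zip(a, b))
    w = num / den if den else 0.5
    return min(1.0, max(0.0, w))  # clip to a valid mixing proportion

# Segment shares for three hypothetical panels (each row sums to 1)
panel_a = [0.40, 0.35, 0.25]
panel_b = [0.20, 0.45, 0.35]
panel_c = [0.30, 0.40, 0.30]

g = grand_mean([panel_a, panel_b, panel_c])   # the "Grand Mean" target
w = best_blend_weight(panel_a, panel_b, g)    # blend of A and B closest to g
```

With more than two panels, the same objective becomes a constrained least-squares problem over a weight vector summing to one, which a numerical optimizer would handle.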
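The panel performance metrics listed earlier (speeders, straight‐liners, and so on) can likewise be sketched in simplified form. The cutoff rule and thresholds below are common rules of thumb assumed for illustration, not the authors' standard quantitative measures.

```python
from statistics import median

def flag_speeders(durations, cutoff=0.5):
    """Flag respondents whose completion time falls below `cutoff`
    times the median duration (an assumed rule of thumb)."""
    m = median(durations)
    return [d < cutoff * m for d in durations]

def flag_straightliners(grid_answers):
    """Flag respondents who gave the identical answer to every
    item in a rating grid."""
    return [len(set(row)) == 1 for row in grid_answers]

# Hypothetical data: completion times (minutes) and grid responses
durations = [17.2, 16.5, 4.1, 18.0, 15.9]
grids = [[3, 4, 2, 5], [5, 5, 5, 5], [1, 2, 1, 3], [4, 4, 3, 4], [2, 2, 2, 2]]

speeders = flag_speeders(durations)      # third respondent is flagged
liners = flag_straightliners(grids)      # second and fifth are flagged
```

Rates of such flags, tracked consistently across panels, are the kind of standardized comparison the abstract argues for.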
Web survey bibliography - Other (439)
- A comparison of surveys using different modes of data collection; 2010; Revilla, M., Saris, W. E.
- Examining the effects of website-induced flow in professional sporting team websites; 2010; O'Cass, A., Carlson, J.
- Research into questionnaire design - A summary of the literature; 2010; Lietz, P.
- College Students' Response Rate to an Incentivized Combination of Postal and Web-Based Health Survey; 2010; Balajti I., Daragó, L., Ádány, R., Kósa, K.
- Improving the response rate and quality in Web-based surveys through the personalization and frequency...; 2010; Muñoz-Leiva, F., Sánchez-Fernández, J., Montoro-Ríos, F. J., Ibáñez-Zapata, J. A.
- What are participants doing while filling in an online questionnaire: A paradata collection tool and...; 2010; Stieger, S., Reips, U.-D.
- ESS Handbook for Quality Reports; 2009
- ESS Standard for Quality Reports; 2009
- MarketTools TrueSample; 2009
- ISO 26362 Access panels in market, opinion, and social research-Vocabulary and service requirements; 2009
- Stochastic properties of the Internet sample; 2009; Getka-Wilczynska, E.
- Web based survey: an emerging tool; 2009; Srivenkataramana, T., Saisree, M.
- The impact of gender in e-mailed survey invitations; 2009; Derham, P.
- The Coverage Bias of Mobile Web Surveys Across European Countries; 2009; Fuchs, M., Busse, B.
- Interactivity in self-administered surveys. Influence on respondents' experience; 2009; Suarez Vazquez, A., Garcia Rodriguez, N., Alvarez, M. B.
- Metrics for panel contribution: a non probabilistic platform; 2009; Gittelman, S. H., Trimarchi, E.
- Mode effects in Switzerland: non‐response and measurement error on the European Social Survey; 2009; Roberts, C.
- Reason analysis: an ambitious alternative for mixed‐mode survey design; 2009; Jerabek, H.
- Response rates in multi actor surveys; 2009; Pasteels, I., Ponnet, K., Mortelmans, D.
- Unit non‐response in panel surveys: empirical finding from an experiment; 2009; Haunberger, S.
- Computer-Assisted Audio Recording (CARI): Repurposing a Tool for Evaluating Comparative Instrument Design...; 2009; Edwards, B., Hicks, W., Tourangeau, K., Harris-Kojetin, L., Moss, A.
- Comparison between Liss panel (web) and ESS data (face to face); 2009; Revilla, M., Saris, W. E.
- The influence of the field time on data quality in list-based Web surveys; 2009; Goeritz, A., Stieger, S.
- Why don’t all Businesses report on Web?; 2009; Haraldsen, G.
- Turning Grid Questions into Sequences in Business Web Surveys; 2009; Haraldsen, G., Bergstrøm, Y.
- Visual Design Effects on Respondents’ Behavior in Web-Surveys; 2009; Greinoecker, A.
- Applying theory to structure respondents' stated motivations for participating in web surveys; 2009; Han, V., Albaum, G., Wiley, J. B., Thirkell, P.
- Web-based survey attracted age-biased sample with more severe illness than paper-based survey; 2009; Klovning, A., Sandvik, H., Hunskaar, S.
- Online Election Surveys: Keeping the Voters Honest?; 2009; Gibson, R., McAllister, I.
- A recipe for effective participation rates for web-based surveys; 2009; Bennett, L., Nair, C. S.
- Pause Mechanism in Complex Online Surveys; 2009; Milewski, J.
- Response Formats in Cross-cultural Comparisons in Web-based Surveys; 2009; Thomas, R. K., Terhanian, G., Funke, F.
- Relevance Of Health-Related Online-Information In Offline- And Online-Samples; 2009; Stetina, B. U., McElheney, J., Lehenbauer, M., Hinterberger, E., Pintzinger, N., Kryspin-Exner, I.
- Three Different Designs of Ranking‐Type Questions; 2009; Sackl, A.
- Gay and Lesbian People: The Use of Online Communication Services; 2009; Lehenbauer, M., Stetina, B. U., Kryspin-Exner, I.
- An Online Study on Coping with Anxiety and Disease-Specific Internet Use in Panic Attack Sufferers; 2009; König, D., Hiebler, C., Kryspin-Exner, I.
- Distortion of demographics through technically induced dropout in restricted online surveys; 2009; Voracek, M., Stieger, S., Goeritz, A.
- An Internet-based Study on Coping with Illness and Attitudes towards Online Health Care in Cancer Patients...; 2009; Setz, J., König, D., Kryspin-Exner, I.
- WebEXEC: A Short Self-Report Measure of Executive Function Suitable for Administration via the Internet...; 2009; Buchanan, T., Heffernan, T. M., Parrott, A. C., Ling, J., Rodgers, J., Scholey, A. B.
- Let's go formative: Continuous student ratings with Web 2.0 application Twitter; 2009; Burger, C., Stieger, S.
- Self-Efficacy Of Online Health Seekers; 2009; Stetina, B. U., Schramel, C., Lehenbauer, M., Schawill, W., Kryspin-Exner, I.
- Diffusion of Mobile Services Adoption in Taiwan; 2009; Doong, H.-S., Wang, H.-C.
- Verbal Vs Visual Response Options: Reconciling Meanings Conveyed by a Computer Aided Visual Rating Scale...; 2009; Garland, P., Cape, P.
- Increasing response rates in list based samples; 2009; Keusch, F., Kurz, H., Penzkofer, P.
- Implementation of a reaction time tool for brand measurement at Swisscom; 2009; Paar, I., Urbahn, J.
- Measuring Network Quality: Strengths and Weaknesses of different Evaluation Methods (SMS, w@p and web...; 2009; Wallisch, A., Schwab, H.
- Large Scale Digital Data Collection in Developing Countries: Is The Time Right?; 2009; Hattas, M., Cronje, M., Berard, O.
- Implementation of web-based data-collection channel eSTAT for economic entities; 2009; Sillajoe, T.
- Personality on Social Network Sites: An Application of the Five Factor Model; 2009; Wehrli, S.
- Use of Online Interviews in the Underlying Discourse Unveiling Method (UDUM); 2009; Nicolaci-da-Costa, A. M., Romao-Dias, D., Di Luccio, F.